Scaling-Rotation Distance and Interpolation of Symmetric Positive-Definite Matrices
Authors
Abstract
We introduce a new geometric framework for the set of symmetric positive-definite (SPD) matrices, aimed at characterizing deformations of SPD matrices by individual scaling of eigenvalues and rotation of eigenvectors of the SPD matrices. To characterize the deformation, the eigenvalue-eigenvector decomposition is used to find alternative representations of SPD matrices and to form a Riemannian manifold so that scaling and rotations of SPD matrices are captured by geodesics on this manifold. The problems of nonunique eigen-decompositions and eigenvalue multiplicities are addressed by finding minimal-length geodesics, which gives rise to a distance and an interpolation method for SPD matrices. Computational procedures for evaluating the minimal scaling-rotation deformations and distances are provided for the most useful cases of 2 × 2 and 3 × 3 SPD matrices. In the new geometric framework, minimal scaling-rotation curves interpolate eigenvalues at constant logarithmic rate, and eigenvectors at constant angular rate. In the context of diffusion tensor imaging, this results in better behavior of the trace, determinant, and fractional anisotropy of interpolated SPD matrices in typical cases.
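The abstract's key property — eigenvalues interpolated at a constant logarithmic rate, eigenvectors rotated at a constant angular rate — can be sketched for the 2 × 2 case. This is a simplified illustration, not the paper's full algorithm: it fixes one eigen-decomposition per matrix, whereas the paper's method minimizes over all valid (non-unique) decompositions to obtain the minimal scaling-rotation geodesic.

```python
import numpy as np

def scaling_rotation_interp(A, B, t):
    """Interpolate between 2x2 SPD matrices A and B so that eigenvalues
    move at a constant logarithmic rate and eigenvectors rotate at a
    constant angular rate. Simplified sketch: assumes one fixed
    eigen-decomposition per matrix."""
    wa, Va = np.linalg.eigh(A)   # ascending eigenvalues, orthonormal columns
    wb, Vb = np.linalg.eigh(B)
    # Flip one column if needed so each frame is a proper rotation (det = +1).
    if np.linalg.det(Va) < 0:
        Va[:, 1] = -Va[:, 1]
    if np.linalg.det(Vb) < 0:
        Vb[:, 1] = -Vb[:, 1]
    # Rotation angle of each eigenvector frame, and the wrapped difference.
    ta = np.arctan2(Va[1, 0], Va[0, 0])
    tb = np.arctan2(Vb[1, 0], Vb[0, 0])
    d = (tb - ta + np.pi) % (2.0 * np.pi) - np.pi
    th = ta + t * d
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    # Eigenvalues interpolated linearly in log-space.
    w = np.exp((1.0 - t) * np.log(wa) + t * np.log(wb))
    return R @ np.diag(w) @ R.T
```

One consequence visible even in this sketch: the determinant of the interpolant is det(A)^(1-t) · det(B)^t, so it avoids the determinant swelling that linear interpolation of SPD matrices exhibits.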
Similar resources
Sparse and low-rank approximations of large symmetric matrices using biharmonic interpolation
Geodesic distance matrices can reveal shape properties that are largely invariant to non-rigid deformations, and thus are often used to analyze and represent 3-D shapes. However, these matrices grow quadratically with the number of points. Thus for large point sets it is common to use a low-rank approximation to the distance matrix, which fits in memory and can be efficiently analyzed using met...
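The low-rank idea mentioned above can be illustrated generically with a truncated eigendecomposition of a symmetric matrix; note this is only the baseline idea, not the article's biharmonic interpolation scheme, which is a more specialized sparse construction.

```python
import numpy as np

def low_rank_approx(M, k):
    """Best rank-k approximation (in Frobenius norm) of a symmetric
    matrix M, keeping the k eigenpairs of largest magnitude."""
    w, V = np.linalg.eigh(M)
    idx = np.argsort(np.abs(w))[::-1][:k]
    return (V[:, idx] * w[idx]) @ V[:, idx].T
```

For n points, storing the k retained eigenpairs costs O(nk) memory instead of the O(n^2) of the full distance matrix.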
Scaling symmetric positive definite matrices to prescribed row sums
We give a constructive proof of a theorem of Marshall and Olkin that any real symmetric positive definite matrix can be symmetrically scaled by a positive diagonal matrix to have arbitrary positive row sums. We give a slight extension of the result, showing that given a sign pattern, there is a unique diagonal scaling with that sign pattern, and we give upper and lower bounds on the entries of ...
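A diagonal scaling with prescribed row sums can be computed in practice with a simple fixed-point iteration. This is a sketch under the assumption that A is SPD with positive entries (a symmetric Sinkhorn-style update); the Marshall–Olkin theorem covers general SPD matrices, and this naive iteration is not the article's constructive proof.

```python
import numpy as np

def scale_to_row_sums(A, r, iters=500):
    """Find a positive diagonal D so that D @ A @ D has row sums r.
    Fixed-point sketch assuming A is SPD with positive entries;
    the fixed point satisfies d_i * (A d)_i = r_i."""
    d = np.ones(len(r))
    for _ in range(iters):
        d = np.sqrt(d * r / (A @ d))
    return np.diag(d)
```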
Riemannian Metric Learning for Symmetric Positive Definite Matrices
Over the past few years, symmetric positive definite (SPD) matrices have been receiving considerable attention from the computer vision community. Though various distance measures have been proposed in the past for comparing SPD matrices, the two most widely used measures are the affine-invariant distance and the log-Euclidean distance. This is because these two measures are true geodesic distances induced...
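The two standard distances named in this snippet have compact closed forms: the log-Euclidean distance is the Frobenius norm of the difference of matrix logarithms, and the affine-invariant distance is ||logm(A^(-1/2) B A^(-1/2))||_F. A minimal sketch using eigendecompositions (valid because the arguments are SPD):

```python
import numpy as np

def spd_logm(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def spd_power(S, p):
    """Matrix power S^p of an SPD matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** p) @ V.T

def log_euclidean_dist(A, B):
    """d(A, B) = || logm(A) - logm(B) ||_F."""
    return np.linalg.norm(spd_logm(A) - spd_logm(B), 'fro')

def affine_invariant_dist(A, B):
    """d(A, B) = || logm(A^{-1/2} B A^{-1/2}) ||_F."""
    Ah = spd_power(A, -0.5)
    return np.linalg.norm(spd_logm(Ah @ B @ Ah), 'fro')
```

The two distances coincide when A and B commute (e.g. simultaneously diagonal matrices) and differ in general.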
Symmetric Positive-Definite Matrices: From Geometry to Applications and Visualization
In many engineering applications that use tensor analysis, such as tensor imaging, the underlying tensors have the characteristic of being positive definite. It might therefore be more appropriate to use techniques specially adapted to such tensors. We will describe the geometry and calculus on the Riemannian symmetric space of positive-definite tensors. First, we will explain why the geometry,...
On Lyapunov Scaling Factors of Real Symmetric Matrices
A real square matrix A is said to be Lyapunov diagonally semistable if there exists a positive definite diagonal matrix D, called a Lyapunov scaling factor of A, such that the matrix AD + DA^T is positive semidefinite. Lyapunov diagonally semistable matrices play an important role in applications in several disciplines, and have been studied in many matrix-theoretical papers, see for example [2]...
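Verifying a candidate scaling factor is straightforward, even though finding one (the problem studied in the article) is harder. A minimal check of the definition above:

```python
import numpy as np

def is_lyapunov_scaling_factor(A, d, tol=1e-10):
    """Check whether D = diag(d) is a Lyapunov scaling factor of A,
    i.e. whether A @ D + D @ A.T is positive semidefinite.
    (This only verifies a candidate D; it does not construct one.)"""
    D = np.diag(np.asarray(d, dtype=float))
    M = A @ D + D @ A.T          # symmetric, since D A^T = (A D)^T
    return bool(np.all(np.linalg.eigvalsh(M) >= -tol))
```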
Journal: SIAM J. Matrix Analysis Applications
Volume 36, Issue -
Pages -
Publication date: 2015